Bounds on Non-Symmetric Divergence Measures in Terms of Symmetric Divergence Measures
Abstract
Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are the Kullback-Leibler [13] relative information and the Jeffreys [12] J-divergence. The Jensen-Shannon divergence of Sibson [17] has also found applications in the literature. The author [20] studied a new divergence measure based on arithmetic and geometric means. Measures such as the harmonic mean divergence and triangular discrimination [6] are also known in the literature. Recently, Dragomir et al. [10] also studied a new measure similar to the J-divergence, which we call here the relative J-divergence. Another measure, arising from the Jensen-Shannon divergence, was studied by Lin [15]; here we call it the relative Jensen-Shannon divergence. The relative arithmetic-geometric divergence (Taneja [24]) is also studied here. All these measures can be written as particular cases of the Csiszár f-divergence. By putting some conditions on the probability distributions, the aim here is to obtain bounds among the measures.
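For orientation, the Csiszár f-divergence mentioned above is commonly written, for discrete probability distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n), as follows; the notation and the two sample generating functions shown here follow common usage and are not quoted from the paper itself:

\[
C_f(P,Q) = \sum_{i=1}^{n} q_i \, f\!\left(\frac{p_i}{q_i}\right), \qquad f \ \text{convex on } (0,\infty), \quad f(1) = 0,
\]

with, for example,

\[
f(u) = u\ln u \;\Rightarrow\; C_f(P,Q) = K(P,Q) = \sum_{i=1}^{n} p_i \ln\frac{p_i}{q_i} \quad \text{(Kullback-Leibler relative information)},
\]
\[
f(u) = (u-1)\ln u \;\Rightarrow\; C_f(P,Q) = J(P,Q) = \sum_{i=1}^{n} (p_i - q_i)\ln\frac{p_i}{q_i} \quad \text{(Jeffreys J-divergence)}.
\]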
Similar resources
Relative Divergence Measures and Information Inequalities
Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are Kullback-Leibler's [17] relative information and the Jeffreys [16] J-divergence. The information radius or Jensen difference divergence measure due to Sibson [23] and Burbea and Rao [3, 4] has also found applications in the literature. Taneja [25] studied anoth...
Generalized Symmetric Divergence Measures and the Probability of Error
Abstract. Three classical divergence measures exist in the literature on information theory and statistics: the Jeffreys-Kullback-Leibler [6], [8] J-divergence, the Sibson-Burbea-Rao [9], [3] Jensen-Shannon divergence, and the Taneja [11] arithmetic-geometric divergence. These three measures bear an interesting relationship among each other (one such relation is sketched below). Divergence measures like the Hellinger [5...
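One relationship of this kind, stated here using the standard forms of the three measures rather than anything quoted from the truncated abstract above, can be verified by direct expansion. With

\[
J(P,Q) = \sum_{i=1}^{n} (p_i - q_i)\ln\frac{p_i}{q_i}, \qquad
I(P,Q) = \frac{1}{2}\sum_{i=1}^{n}\left[\, p_i \ln\frac{2p_i}{p_i+q_i} + q_i \ln\frac{2q_i}{p_i+q_i} \right], \qquad
T(P,Q) = \sum_{i=1}^{n} \frac{p_i+q_i}{2}\,\ln\frac{p_i+q_i}{2\sqrt{p_i q_i}},
\]

expanding the logarithms term by term gives

\[
I(P,Q) + T(P,Q) = \frac{1}{4}\,J(P,Q).
\]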
On a Symmetric Divergence Measure and Information Inequalities
A non-parametric symmetric measure of divergence which belongs to the family of Csiszár's f-divergences is proposed. Its properties are studied and bounds in terms of some well-known divergence measures are obtained. An application to the mutual information is considered. A parametric measure of information is also derived from the suggested non-parametric measure. A numerical illustration to comp...
Bounds on Nonsymmetric Divergence Measure in terms of Other Symmetric and Nonsymmetric Divergence Measures
Vajda (1972) studied a generalized divergence measure of Csiszár's class, the so-called "Chi-m divergence measure." The variational distance and the chi-square divergence are special cases of this generalized divergence measure at m = 1 and m = 2, respectively. In this work, a non-parametric non-symmetric measure of divergence, a particular case of the Vajda generalized divergence at m = 4, is taken and charac...
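For reference, Vajda's Chi-m divergence is commonly written as follows; the notation is the standard one and is assumed here rather than quoted from that work:

\[
\chi^{m}(P,Q) = \sum_{i=1}^{n} \frac{|p_i - q_i|^{m}}{q_i^{\,m-1}}, \qquad m \ge 1,
\]

so that m = 1 gives the variational distance \(\sum_i |p_i - q_i|\), m = 2 gives the chi-square divergence \(\sum_i (p_i - q_i)^2 / q_i\), and the case m = 4 considered there is \(\sum_i (p_i - q_i)^4 / q_i^{3}\).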
On Unified Generalizations of Relative Jensen-Shannon and Arithmetic-Geometric Divergence Measures, and Their Properties, by Pranesh Kumar and Inder Jeet Taneja
Abstract. In this paper we consider a one-parameter generalization of some non-symmetric divergence measures, namely the Kullback-Leibler relative information, the χ²-divergence, the relative J-divergence, the relative Jensen-Shannon divergence, and the relative arithmetic-geometric divergence. All the generalizations considered can be written as particular case...